This paper proposes a computationally efficient method to estimate the time-varying relative pose between two visual-inertial sensor rigs mounted on the flexible wings of a fixed-wing unmanned aerial vehicle (UAV). The estimated relative poses are used to generate highly accurate depth maps in real time and can be employed for obstacle avoidance in low-altitude flights or landing maneuvers. The approach is structured as follows: initially, a wing model is identified by fitting a probability density function to measured deviations from the nominal relative baseline transformation. At run-time, the prior knowledge about the wing model is fused in an Extended Kalman filter (EKF) together with relative pose measurements obtained from solving a relative perspective-n-point (PnP) problem, and with the linear accelerations and angular velocities measured by the two inertial measurement units (IMUs) that are rigidly attached to the cameras. Results obtained from extensive synthetic experiments demonstrate that our proposed framework is able to estimate highly accurate baseline transformations and depth maps.
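The core fusion idea, combining a static wing-model prior with noisy relative-pose measurements in a Kalman filter, can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: it reduces the baseline deviation to a single scalar, uses an identity measurement model in place of the PnP solution and IMU propagation, and all names and noise parameters are assumptions.

```python
import numpy as np

def kf_update(x, P, z, R):
    """One Kalman update for a scalar state x with variance P and a
    direct measurement z with noise variance R (identity model)."""
    K = P / (P + R)          # Kalman gain
    x_new = x + K * (z - x)  # corrected estimate
    P_new = (1.0 - K) * P    # reduced uncertainty
    return x_new, P_new

# Prior from the identified wing model (illustrative values):
# zero-mean baseline deviation with variance 0.01 m^2.
x, P = 0.0, 0.01

true_dev = 0.05  # simulated true wing deflection in meters
rng = np.random.default_rng(0)
for _ in range(50):
    z = true_dev + rng.normal(0.0, 0.02)  # noisy PnP-like measurement
    x, P = kf_update(x, P, z, R=0.02**2)
```

After fusing repeated measurements, the estimate converges toward the true deviation while the posterior variance shrinks well below the prior, which is the mechanism the paper exploits (in full 6-DoF form, with IMU-driven prediction) to track the time-varying baseline.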